Generalization with Componential Attractors: Word and Nonword Reading in an Attractor Network

Authors

  • David C. Plaut
  • James L. McClelland
Abstract

Networks that learn to make familiar activity patterns into stable attractors have proven useful in accounting for many aspects of normal and impaired cognition. However, their ability to generalize is questionable, particularly in quasiregular tasks that involve both regularities and exceptions, such as word reading. We trained an attractor network to pronounce virtually all of a large corpus of monosyllabic words, including both regular and exception words. When tested on the lists of pronounceable nonwords used in several empirical studies, its accuracy was closely comparable to that of human subjects. The network generalizes because the attractors it developed for regular words are componential—they have substructure that reflects common sublexical correspondences between orthography and phonology. This componentiality is facilitated by the use of orthographic and phonological representations that make explicit the structured relationship between written and spoken words. Furthermore, the componential attractors for regular words coexist with much less componential attractors for exception words. These results demonstrate that attractors can support effective generalization, challenging “dual-route” assumptions that multiple, independent mechanisms are required for quasiregular tasks.
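As an illustrative aside, the abstract's core notion of "familiar activity patterns becoming stable attractors" can be sketched with a minimal Hopfield-style network. This is not the paper's model (which is a trained recurrent network mapping orthography to phonology); it only demonstrates, under standard Hebbian-learning assumptions, how a corrupted pattern settles back into a stored attractor. All names and parameter values here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    """Hebbian outer-product weights that tend to make each
    +/-1 pattern a stable attractor of the settling dynamics."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def settle(W, state, max_sweeps=20):
    """Asynchronously update units until a fixed point is reached;
    with symmetric weights this always converges."""
    state = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(state.size):
            s = 1.0 if W[i] @ state >= 0 else -1.0
            if s != state[i]:
                state[i], changed = s, True
        if not changed:  # fixed point: a stable attractor state
            break
    return state

n_units, n_patterns = 64, 3
patterns = rng.choice([-1.0, 1.0], size=(n_patterns, n_units))
W = hebbian_weights(patterns)

# Corrupt a stored pattern and let the network settle; it typically
# falls back into the attractor for the uncorrupted pattern.
cue = patterns[0].copy()
cue[rng.choice(n_units, size=8, replace=False)] *= -1
recovered = settle(W, cue)
```

The contrast drawn in the paper is that attractors like these are holistic (one basin per stored item), whereas the componential attractors it reports have substructure shared across regular words, which is what supports generalization to nonwords.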


Related articles

Methods for Learning Articulated Attractors over Internal Representations

Recurrent attractor networks have many virtues which have prompted their use in a wide variety of connectionist cognitive models. One of these virtues is the ability of these networks to learn articulated attractors — meaningful basins of attraction arising from the systematic interaction of explicitly trained patterns. Such attractors can improve generalization by enforcing “well formedness” c...

Schemata-Building Role of Teaching Word History in Developing Reading Comprehension Ability

Methodologically, vocabulary instruction has faced significant ups and downs during the history of language education; sometimes it has been integrated with the other elements of the language network, other times tackled as a separate component. Among many variables supposedly affecting vocabulary achievement, the role of teaching word history, as a schemata-building strategy, in developing reading comprehensi...

On the Upper Semicontinuity of Cocycle Attractors for Non-autonomous and Random Dynamical Systems

In this paper we prove an upper semicontinuity result for perturbations of cocycle attractors. In particular, we study the relationship between non-autonomous and global attractors. In this sense, we show that the concept of a cocycle attractor is a sensible generalization to non-autonomous and random dynamical systems of that of the global attractor.

On Riddling and Weak Attractors

We propose a general definition for riddling of a subset V of R^m with nonzero Lebesgue measure and show that it is appropriate for a large class of dynamical systems. We introduce the concept of a weak attractor, a weaker notion than a Milnor attractor, and use this to re-examine and classify riddled basins of attractors. We find that basins of attraction can be partially riddled, but if this is the case ...

In Search Of Articulated Attractors

Recurrent attractor networks offer many advantages over feedforward networks for the modeling of psychological phenomena. Their dynamic nature allows them to capture the time course of cognitive processing, and their learned weights may often be easily interpreted as soft constraints between representational components. Perhaps the most significant feature of such networks, however, is their ab...


Journal:

Volume   Issue

Pages  -

Publication year: 1993